help #239
srinadhkesineni started this conversation in General
Replies: 1 comment
-
I couldn't figure out what to do now. Can anyone help me?
-
AgenticSeek is ready.
➤➤➤ open file manager
Selected agent: Browser (roles: web)
▃▉▉▉▃▁ Thinking...
Traceback (most recent call last):
  File "C:\Users\srinadh\agenticSeek\sources\llm_provider.py", line 100, in respond
    thought = llm(history, verbose)
  File "C:\Users\srinadh\agenticSeek\sources\llm_provider.py", line 214, in ollama_fn
    raise e
  File "C:\Users\srinadh\agenticSeek\sources\llm_provider.py", line 197, in ollama_fn
    for chunk in stream:
  File "C:\Users\srinadh\AppData\Local\Programs\Python\Python310\lib\site-packages\ollama\_client.py", line 170, in inner
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: model is required (status code: 400)
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\srinadh\agenticSeek\cli.py", line 78, in <module>
    asyncio.run(main())
  File "C:\Users\srinadh\AppData\Local\Programs\Python\Python310\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "C:\Users\srinadh\AppData\Local\Programs\Python\Python310\lib\asyncio\base_events.py", line 641, in run_until_complete
    return future.result()
  File "C:\Users\srinadh\agenticSeek\cli.py", line 72, in main
    raise e
  File "C:\Users\srinadh\agenticSeek\cli.py", line 66, in main
    if await interaction.think():
  File "C:\Users\srinadh\agenticSeek\sources\interaction.py", line 162, in think
    self.last_answer, self.last_reasoning = await agent.process(self.last_query, self.speech)
  File "C:\Users\srinadh\agenticSeek\sources\agents\browser_agent.py", line 342, in process
    ai_prompt, reasoning = await self.llm_request()
  File "C:\Users\srinadh\agenticSeek\sources\agents\agent.py", line 164, in llm_request
    return await loop.run_in_executor(self.executor, self.sync_llm_request)
  File "C:\Users\srinadh\AppData\Local\Programs\Python\Python310\lib\concurrent\futures\thread.py", line 52, in run
    result = self.fn(*self.args, **self.kwargs)
  File "C:\Users\srinadh\agenticSeek\sources\agents\agent.py", line 171, in sync_llm_request
    thought = self.llm.respond(memory, self.verbose)
  File "C:\Users\srinadh\agenticSeek\sources\llm_provider.py", line 116, in respond
    raise Exception(f"Provider {self.provider_name} failed: {str(e)}") from e
Exception: Provider ollama failed: model is required (status code: 400)
PS C:\Users\srinadh\agenticSeek>
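A note on the error itself: `model is required (status code: 400)` comes straight from the Ollama server, and it means the chat request AgenticSeek sent contained no model name. The usual cause is an empty or missing `provider_model` entry in `config.ini` (key names here assume the layout from the AgenticSeek README; they may differ in your version). You would want `provider_model` set to a model you have actually pulled, e.g. `ollama pull deepseek-r1:14b`, then verify with `ollama list`. A minimal sketch of a pre-flight check that would surface this misconfiguration before any request is sent (`check_model_configured` is a hypothetical helper, not part of AgenticSeek):

```python
import configparser


def check_model_configured(path="config.ini"):
    """Fail fast if no Ollama model is configured.

    Reads the [MAIN] section of an AgenticSeek-style config.ini and
    returns the configured model name, or raises ValueError if it is
    empty -- the condition that otherwise surfaces later as Ollama's
    "model is required (status code: 400)".
    """
    cfg = configparser.ConfigParser()
    cfg.read(path)
    model = cfg.get("MAIN", "provider_model", fallback="").strip()
    if not model:
        raise ValueError(
            "provider_model is empty in config.ini; Ollama will reject "
            "requests with 'model is required (status code: 400)'"
        )
    return model
```

Running a check like this at startup turns a deep, confusing traceback into a one-line configuration error.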